Results 1 - 20 of 23
1.
Artificial Intelligence in Covid-19 ; : 193-228, 2022.
Article in English | Scopus | ID: covidwho-20231791

ABSTRACT

Forecasting epidemic dynamics has been an active area of research for at least two decades. The importance of the topic is evident: policy makers, citizens, and scientists would all like to get accurate and timely forecasts. In contrast to physical systems, the co-evolution of epidemics, individual and collective behavior, viral dynamics, and public policies makes epidemic forecasting a problematic task. The situation is even more challenging during a pandemic, as has become amply clear during the ongoing COVID-19 pandemic. Researchers worldwide have put in extraordinary efforts to try to forecast the time-varying evolution of the pandemic; despite their best efforts, it is fair to say that the results have been mixed. Several teams have done well on average but failed to forecast upsurges in cases. In this chapter, we describe the state of the art in epidemic forecasting, with a particular emphasis on forecasting during an ongoing pandemic. We describe a range of methods that have been developed and discuss the experience of our team in this context. We also summarize several challenges in producing accurate and timely forecasts. © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2022.

2.
2022 IEEE International Conference on Big Data, Big Data 2022 ; : 1594-1603, 2022.
Article in English | Scopus | ID: covidwho-2248082

ABSTRACT

Real-time forecasting of non-stationary time series is a challenging problem, especially when the time series evolves rapidly. For such cases, it has been observed that ensemble models consisting of a diverse set of model classes can perform consistently better than individual models. To account for the non-stationarity of the data and the limited availability of training examples, the models are retrained in real time using the most recent observed data samples. Motivated by the robust performance properties of ensemble models, we developed a Bayesian model averaging ensemble technique consisting of statistical, deep learning, and compartmental models for forecasting epidemiological signals, specifically COVID-19 signals. The epidemic dynamics go through several phases (waves), and in our ensemble model we observed that different model classes performed differently during the various phases. Armed with this understanding, in this paper we propose a modification to the ensembling method that employs this phase information and uses different weighting schemes for each phase to produce improved forecasts. However, predicting the phases of such time series is a significant challenge, especially when behavioral and immunological adaptations govern the evolution of the time series. We explore multiple datasets that can serve as leading indicators of trend changes and employ transfer entropy techniques to capture the relevant indicator. We propose a phase prediction algorithm to estimate the phases using the leading indicators. Using the knowledge of the estimated phase, we selectively sample the training data from similar phases. We evaluate our proposed methodology on our currently deployed COVID-19 forecasting model and the COVID-19 ForecastHub models. The overall performance of the proposed model is consistent across the pandemic. More importantly, it is ranked second during two critical rapid-growth phases in cases, regimes where the performance of most models from the ForecastHub dropped significantly. © 2022 IEEE.
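
A minimal sketch of the phase-aware weighting idea described in this abstract, assuming hypothetical model names, weekly errors, and phase labels (the deployed ensemble, its model classes, and its phase-prediction algorithm are not reproduced here): each model's weight for the current phase is proportional to the exponentiated negative mean error it achieved during past weeks labeled with the same phase, and the ensemble forecast is the weighted combination.

import numpy as np

def phase_weights(errors_by_week, phase_by_week, current_phase, tau=1.0):
    """Weight each model by exp(-mean error) computed only over past weeks
    labeled with the same phase as the current one (a simplified BMA-style
    scheme). errors_by_week: dict model -> list of weekly absolute errors."""
    weights = {}
    for model, errs in errors_by_week.items():
        same_phase = [e for e, p in zip(errs, phase_by_week) if p == current_phase]
        mean_err = np.mean(same_phase) if same_phase else np.mean(errs)
        weights[model] = np.exp(-mean_err / tau)
    total = sum(weights.values())
    return {m: w / total for m, w in weights.items()}

def ensemble_forecast(point_forecasts, weights):
    """Combine per-model point forecasts using the phase-specific weights."""
    return sum(weights[m] * f for m, f in point_forecasts.items())

# Hypothetical example: three model classes, weekly absolute errors, and a
# phase label ("growth" or "decline") for each past week.
errors = {"ARIMA": [3.0, 8.0, 2.5], "LSTM": [4.0, 3.0, 6.0], "SEIR": [6.0, 2.0, 7.0]}
phases = ["decline", "growth", "decline"]
w = phase_weights(errors, phases, current_phase="growth")
print(w)
print(ensemble_forecast({"ARIMA": 120.0, "LSTM": 150.0, "SEIR": 135.0}, w))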

3.
Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence ; : 12573-12579, 2022.
Article in English | Web of Science | ID: covidwho-2243280

ABSTRACT

The deployment of vaccines across the US provides significant defense against serious illness and death from COVID-19. Over 70% of vaccine-eligible Americans are at least partially vaccinated, but there are pockets of the population that are under-vaccinated, such as in rural areas and some demographic groups (e.g. age, race, ethnicity). These pockets are extremely susceptible to the Delta variant, exacerbating the healthcare crisis and increasing the risk of new variants. In this paper, we describe a data-driven model that provides real-time support to Virginia public health officials by recommending mobile vaccination site placement in order to target under-vaccinated populations. Our strategy uses fine-grained mobility data, along with US Census and vaccination uptake data, to identify locations that are most likely to be visited by unvaccinated individuals. We further extend our model to choose locations that maximize vaccine uptake among hesitant groups. We show that the top recommended sites vary substantially across some demographics, demonstrating the value of developing customized recommendation models that integrate fine-grained, heterogeneous data sources. We also validate our recommendations by analyzing the success rates of deployed vaccine sites, and show that sites placed closer to our recommended areas administered higher numbers of doses. Our model is the first of its kind to consider evolving mobility patterns in real-time for suggesting placement strategies customized for different targeted demographic groups.
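
As a rough illustration of the placement logic sketched above, the toy routine below greedily picks sites that cover the most not-yet-reached unvaccinated visitors; the visit estimates, site names, and block-group identifiers are hypothetical, and the paper's actual model (which integrates fine-grained mobility, US Census, and vaccination-uptake data) is not reproduced.

def greedy_site_selection(visits, k):
    """visits: dict site -> {block_group: estimated unvaccinated visitors}.
    Greedily choose k sites maximizing marginal coverage of block groups not
    yet reached by an earlier pick (a simplified stand-in for the paper's
    recommendation model)."""
    chosen, covered = [], set()
    for _ in range(k):
        best_site, best_gain = None, -1
        for site, bg_counts in visits.items():
            if site in chosen:
                continue
            gain = sum(c for bg, c in bg_counts.items() if bg not in covered)
            if gain > best_gain:
                best_site, best_gain = site, gain
        chosen.append(best_site)
        covered.update(visits[best_site].keys())
    return chosen

# Hypothetical visit estimates for three candidate sites.
visits = {
    "pharmacy_A": {"bg1": 120, "bg2": 40},
    "grocery_B":  {"bg2": 90, "bg3": 60},
    "church_C":   {"bg4": 150},
}
print(greedy_site_selection(visits, k=2))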

4.
International Journal of High Performance Computing Applications ; 37(1):4-27, 2023.
Article in English | Scopus | ID: covidwho-2239171

ABSTRACT

This paper describes an integrated, data-driven operational pipeline based on national agent-based models to support federal and state-level pandemic planning and response. The pipeline consists of (i) an automatic semantic-aware scheduling method that coordinates jobs across two separate high performance computing systems; (ii) a data pipeline to collect, integrate and organize national and county-level disaggregated data for initialization and post-simulation analysis; (iii) a digital twin of national social contact networks made up of 288 million individuals and 12.6 billion time-varying interactions covering the US states and DC; (iv) an extension of a parallel agent-based simulation model to study epidemic dynamics and associated interventions. This pipeline can run 400 replicates of national runs in less than 33 h, and reduces the need for human intervention, resulting in faster turnaround times and higher reliability and accuracy of the results. Scientifically, the work has led to significant advances in real-time epidemic sciences. © The Author(s) 2022.

5.
J Med Virol ; 2022 Oct 11.
Article in English | MEDLINE | ID: covidwho-2237548

ABSTRACT

Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variants of concern (VOCs) have prolonged the coronavirus disease 2019 (COVID-19) pandemic by escaping pre-existing immunity acquired through natural infection or vaccination. Elucidation of VOCs' mutation trends and evasion of neutralization is required to update current control measures. Mutations and the prevalence of VOCs were analyzed in the context of global immunization coverage rates. Lentivirus-based pseudovirus neutralization analysis platforms for the SARS-CoV-2 prototype strain (PS) and the VOCs (Alpha, Beta, Gamma, Delta, and Omicron) were constructed based on the spike protein of each variant, an HEK 293T cell line expressing the human angiotensin-converting enzyme 2 (hACE2) receptor on its surface, and an enhanced green fluorescent protein reporter. Serum samples from 65 convalescent individuals and 20 WIBP-CorV vaccine recipients, and four therapeutic monoclonal antibodies (mAbs), namely imdevimab, casirivimab, bamlanivimab, and etesevimab, were used to evaluate neutralization potency against the variants. Pseudovirus-based neutralization assay platforms for PS and the VOCs were established, and the multiplicity of infection (MOI) was the key factor influencing the assay result. Compared to PS, VOCs may enhance the infectivity of hACE2-293T cells. Except for Alpha, the other VOCs escaped neutralization to varying degrees. Owing to favorable and emerging mutations, the currently circulating Omicron variant demonstrated the most significant neutralization-escaping ability of all VOCs against the sera and mAbs. Compared with the PS pseudovirus, Omicron showed 15.7- and 3.71-fold decreases in the NT50 value (the highest serum dilution corresponding to a neutralization rate of 50%); correspondingly, 90% and 43% of immunization and convalescent serum samples, respectively, lost their neutralizing activity against the Omicron variant. Therefore, SARS-CoV-2 has evolved persistently with a strong ability to escape neutralization, prevailing against the established immune barrier. Our findings provide important clues to controlling the COVID-19 pandemic caused by new variants.
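
For readers unfamiliar with the NT50 metric cited above (defined in the abstract as the highest serum dilution corresponding to a neutralization rate of 50%), the short sketch below computes it for a made-up dilution series and derives a fold-decrease between the prototype strain and a variant; it is illustrative only and not the authors' analysis code.

def nt50(dilutions, neutralization_pct):
    """Return the highest reciprocal serum dilution whose neutralization rate
    is still >= 50%, per the definition given in the abstract.
    dilutions: reciprocal dilution factors, e.g. 20, 40, 80, ...
    neutralization_pct: percent neutralization measured at each dilution."""
    passing = [d for d, n in zip(dilutions, neutralization_pct) if n >= 50.0]
    return max(passing) if passing else None

# Hypothetical dilution series for the prototype strain (PS) and a variant.
dils = [20, 40, 80, 160, 320, 640]
ps_neut      = [98, 95, 90, 78, 62, 51]   # NT50 = 640 in this toy example
variant_neut = [85, 70, 55, 42, 30, 18]   # NT50 = 80 in this toy example
fold_drop = nt50(dils, ps_neut) / nt50(dils, variant_neut)
print(fold_drop)  # 8.0-fold decrease with these made-up values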

6.
7.
College and Research Libraries ; 83(6):946-965, 2022.
Article in English | Scopus | ID: covidwho-2120747

ABSTRACT

The COVID-19 pandemic forced organizations into rapid transition to virtual workplace settings. Librarians at the University of South Florida conducted a study to discover trends in team communication dynamics among academic librarians working remotely during this period. This study was motivated by a desire to gauge the perceived degree of positive or negative impact on group communication dynamics and connectedness before and after the transition, with attention paid to factors that inform team communication. This study used a quantitative approach employing a cross-sectional survey administered to the population of professional academic librarians in the United States. Survey findings exhibited small shifts in dynamics, opening a path for more nuanced examination. Effects on librarianship due to the pandemic are still being felt; it is a topic with long reach and impact, which merits examination. © 2022, Association of College and Research Libraries. All rights reserved.

8.
Clinical Toxicology ; 60(Supplement 2):133, 2022.
Article in English | EMBASE | ID: covidwho-2062724

ABSTRACT

Background: Metformin is the most commonly used diabetes medication, and at supratherapeutic levels it can result in a severe type-A metabolic lactic acidosis known as metformin-associated lactic acidosis (MALA). Treatment of MALA includes aggressive fluid resuscitation, supporting blood pressure, and correcting acidosis. Renal replacement therapy (RRT), usually hemodialysis (HD), is recommended in severe cases with refractory acidosis (with elevated lactate), altered mental status, or shock. To our knowledge, this is the second report of metformin half-life during treatment with continuous veno-venous hemodiafiltration (CVVHDF). Case report: A 53-year-old man died following a reported acute-on-chronic ingestion of 80 g of his metformin tablets resulting in severe, refractory shock and MALA. His peak serum metformin concentration was 53 mcg/mL (therapeutic range 1-2 mcg/mL), peak lactic acid concentration was 49.7 mmol/L, and arterial pH nadir was 7.06. Serial serum metformin concentrations were obtained while on RRT, both HD and CVVHDF. The switch from HD to CVVHDF was made due to staffing shortages during the COVID-19 pandemic. The patient died on hospital day five despite aggressive therapy with renal replacement therapy and multiple vasopressors. Serial metformin concentrations during CVVHDF suggested a half-life of 33 h. Discussion(s): Hemodialysis has been reported to clear metformin at a rate greater than 200 mL/min and continuous veno-venous hemofiltration (CVVH) at greater than 50 mL/min. In this case, metformin levels appeared to follow first-order elimination kinetics during CVVHDF with an estimated half-life of 33 h. Comparatively, metformin has a half-life of 4.7-5.5 h during HD. To our knowledge, this is the second report of an estimated metformin half-life while using the CVVHDF form of continuous renal replacement therapy. The previous case report measured a half-life of 16.5 h on CVVHDF. This case report shows that CVVHDF decreases the half-life of metformin and provides first-order elimination in the setting of overdose. Conclusion(s): The early initiation of HD appears warranted, but prognostic indicators have not been well established. In the absence of HD availability, other forms of RRT (e.g., CVVHDF) can be used and may provide first-order elimination of metformin.
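
As a brief aside on how a half-life is estimated under first-order elimination kinetics (ln C falls linearly with time, so t1/2 = ln 2 / k, where k is the fitted elimination rate constant), here is a small sketch using illustrative concentrations, not the patient's actual values.

import numpy as np

def first_order_half_life(times_h, concentrations):
    """Fit ln(C) = ln(C0) - k*t by least squares and return t1/2 = ln(2)/k."""
    slope, _intercept = np.polyfit(times_h, np.log(concentrations), 1)
    k = -slope
    return np.log(2) / k

# Illustrative serial metformin concentrations (mcg/mL) during CRRT.
t = np.array([0, 6, 12, 24, 36])
c = np.array([53.0, 46.7, 41.2, 32.0, 24.9])
print(round(first_order_half_life(t, c), 1))  # ~33 h with these made-up values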

9.
28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022 ; : 4675-4683, 2022.
Article in English | Scopus | ID: covidwho-2020404

ABSTRACT

We study allocation of COVID-19 vaccines to individuals based on the structural properties of their underlying social contact network. Using a realistic representation of a social contact network for the Commonwealth of Virginia, we study how a limited number of vaccine doses can be strategically distributed to individuals to reduce the overall burden of the pandemic. We show that allocation of vaccines based on individuals' degree (number of social contacts) and total social proximity time is significantly more effective than the commonly used age-based allocation strategy in reducing the number of infections, hospitalizations, and deaths. The overall strategy is robust even: (i) if the social contacts are not estimated correctly; (ii) if the vaccine efficacy is lower than expected or only a single dose is given; (iii) if there is a delay in vaccine production and deployment; and (iv) whether or not non-pharmaceutical interventions continue as vaccines are deployed. For reasons of implementability, we have used degree, which is a simple structural measure that can be easily estimated using several methods, including the digital technology available today. These results are significant, especially for resource-poor countries, where vaccines are less available, have lower efficacy, and are more slowly distributed. © 2022 Owner/Author.
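
A minimal sketch of the degree-prioritized allocation idea, using networkx and a synthetic toy graph as stand-ins for the study's realistic Virginia contact network (which is not reproduced here): rank individuals by degree and vaccinate the highest-degree ones within a fixed dose budget.

import networkx as nx

def degree_based_allocation(graph, num_doses):
    """Return the nodes to vaccinate first, ranked by degree (number of
    social contacts), as in the degree-prioritized strategy described above."""
    ranked = sorted(graph.degree, key=lambda kv: kv[1], reverse=True)
    return [node for node, _deg in ranked[:num_doses]]

# Toy stand-in for a social contact network; the paper uses a realistic
# synthetic network for the Commonwealth of Virginia.
g = nx.barabasi_albert_graph(n=10_000, m=3, seed=42)
priority = degree_based_allocation(g, num_doses=500)
print(len(priority), "individuals selected; max degree =", g.degree[priority[0]])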

10.
Journal of Financial Service Professionals ; 76(5):46, 2022.
Article in English | ProQuest Central | ID: covidwho-2011722

ABSTRACT

The current article presents the findings of a quantitative study of 3,886 adults with disabilities and family members of people with disabilities to examine their financial health during the COVID-19 pandemic (March 2020–March 2021). This study provides hard data to highlight how the first year of the pandemic has exacerbated long-standing obstacles in servicing potential planning clients with disabilities. Participants assessed their overall financial well-being by completing a web-based survey to provide information about the state of their individual or family incomes and financial planning behaviors. Two key findings are germane to financial service planners and the special needs planning (SNP) industry: (1) three-quarters of individuals reported that their incomes were negatively affected by the COVID-19 pandemic; and (2) nearly a third of people with disabilities and their family members have no financial literacy education or training in SNP matters. Collectively, these findings identify both a knowledge gap and a means gap that must be addressed to provide financial planning services to a larger portion of the disabilities market. The vast majority of individuals with disabilities desire, intend to obtain, or are already obtaining some form of financial literacy education or financial planning resources.

11.
J Vasc Surg Cases Innov Tech ; 8(2): 298-299, 2022 Jun.
Article in English | MEDLINE | ID: covidwho-1931016
12.
Annals of Behavioral Medicine ; 56(SUPP 1):S411-S411, 2022.
Article in English | Web of Science | ID: covidwho-1848472
13.
2022 CHI Conference on Human Factors in Computing Systems, CHI EA 2022 ; 2022.
Article in English | Scopus | ID: covidwho-1846564

ABSTRACT

In this paper, we explore how computing device use by people with upper extremity impairment (UEI) was affected by COVID-19. Someone with UEI has reduced use of their shoulders, upper arms, forearms, hands, and/or fingers. We conducted six (6) semi-structured interviews with participants with UEI in the US. We found that people with UEI increased computing device use during COVID-19 not only for remote interactions but also in person. Additionally, social distancing for COVID-19 safety created the need for new assistive technology (AT), authentication requirements, and communication platforms, which introduced their own accessibility barriers. We also found that personal protective equipment (PPE) created new barriers during computing device use, which often caused people with UEI to choose COVID-19 safety over the usability of their computing devices. Based on these findings, we describe future opportunities to make computing devices more accessible for people with UEI to manage the shifts in computing device use introduced by COVID-19. © 2022 ACM.

14.
Journal of Vascular Surgery Cases and Innovative Techniques ; 2022.
Article in English | EuropePMC | ID: covidwho-1842748
15.
Open Forum Infectious Diseases ; 8(SUPPL 1):S104-S106, 2021.
Article in English | EMBASE | ID: covidwho-1746765

ABSTRACT

Background. The COVID-19 pandemic was associated with a significant (28%) reduction of methicillin-resistant Staphylococcus aureus (MRSA) acquisition at UVA Hospital (P=0.016). This "natural experiment" allowed us to analyze 3 key mechanisms by which the pandemic may have influenced nosocomial transmission: 1) enhanced infection control measures (i.e., barrier precautions and hand hygiene), 2) patient-level risk factors, and 3) networks of healthcare personnel (HCP)-mediated contacts. Hospital MRSA acquisition was defined as a new clinical or surveillance positive in patients with prior unknown or negative MRSA status occurring >72 h after admission. Ten-month time periods pre- (5/6/2019 to 2/23/2020) and post-COVID-19 (5/4/2020 to 2/28/2021) were chosen to mitigate the effects of seasonality. A 6-week wash-in period was utilized, coinciding with the onset of several major hospital-wide infection control measures (opening of 2 special pathogen units with universal contact/airborne precautions on 4/1/20 and 5/1/20, and universal mask (4/10/20) and eye protection (4/20/20) policies instituted along with staff education efforts including the importance of standard precautions). Mean MRSA acquisition rates pre-COVID-19 (0.92 events per 1,000 patient days) significantly declined post-COVID-19 (to 0.66; P=0.016). Independent-samples t tests (2-tailed) were used for statistical comparisons, except for variables without a normal distribution (Shorr Scores), for which a Mann-Whitney U test was used. Methods. Census-adjusted hospital-acquired MRSA acquisition events were analyzed over 10 months pre- (5/6/2019 to 2/23/2020) and post-COVID-19 (5/4/2020 to 2/28/2021), with a 6-week wash-in period coinciding with hospital-wide intensification of infection control measures (e.g., universal masking). HCP hand hygiene compliance rates were examined to reflect adherence to infection control practices. To examine impacts of non-infection control measures on MRSA transmission, we analyzed pre-/post-COVID-19 differences in individual risk profiles for MRSA acquisition as well as a broad suite of properties of the hospital social network, using person-location and person-person interactions inferred from the electronic medical record. Figure 2. Social Network Construction. We constructed a contact network of hospitalized patients and staff at University of Virginia Hospital to analyze the properties of both person-location and person-person networks and their changes pre- and post-COVID-19. Colocation data (inferred from shared patient rooms and healthcare personnel (HCP)-patient interactions recorded in the electronic health record, e.g., medication administration) were used to construct contact networks, with nodes representing patients and HCP, and edges representing contacts. At each time step, co-location is inferred from the EMR data, which specifies interactions between HCP and patients; these interactions are represented as a temporal network. Results. Hand hygiene compliance significantly improved post-COVID-19, in parallel with other infection control measures. Patient Shorr Scores (an index of individual MRSA risk) were statistically similar pre-/post-COVID-19. Analysis of various network properties demonstrated no trends to suggest a reduced outbreak threshold post-COVID-19. Figure 3. Hand Hygiene Compliance Rates. Analysis of hospital-wide hand hygiene auditing data (anonymous auditors deployed to various units across UVA Hospital, with an average of 1,710 observations per month (range 340 - 7,187)) demonstrated a statistically significant (6%) improvement in average monthly hand hygiene compliance (86.9% pre- versus 93.1% post-COVID-19; P=0.008). Figure 4. Individual MRSA Risk Factors. We calculated the Shorr Score (a validated tool to estimate individual risk for MRSA carriage in hospitalized patients; Shorr et al. Arch Intern Med. 2008;168(20):2205-10) for patients using data from the electronic health record to test the hypothesis that individual risk factors in aggregate did not change significantly in the post-COVID-19 period to explain changes in MRSA acquisition. Values for this score ranged from 0 to 10 based on the following criteria: recent hospitalization (4), nursing home residence (3), hemodialysis (2), ICU admission (1). The Mann-Whitney effect size (E) of 0.53 (P=0.51) indicated that the pre- and post-COVID-19 Shorr score distributions were very similar. We analyzed three major types of network properties: (1) Node properties of the pre- and post-COVID-19 networks, which consisted of all the edges in the pre- and post-COVID-19 periods, respectively. We considered a number of standard properties used in social network analysis to quantify opportunities for patient-patient transmission: degree centrality (links held by each node), betweenness centrality (times each node acts as the shortest 'bridge' between two other nodes), closeness centrality (how close each node is to other nodes in the network), eigenvector centrality (a node's relative influence on the network), and clustering coefficient (degree to which nodes cluster together) (Newman, Networks: An Introduction, 2010). These properties generally did not have a normal distribution, and therefore we used a Mann-Whitney U test on random subsets of nodes in these networks to compare pre- and post-COVID-19 properties; all of these network properties were statistically similar. (2) Properties of the ego networks (networks induced by each node and its 'one-hop' neighbors). We considered density (average number of neighbors for each node; higher density generally favors a lower outbreak threshold) and degree centrality (number of links held by each node) of the ego networks. By the Mann-Whitney test, there were no statistically significant differences in these properties between the pre- and post-COVID-19 networks. (3) Aggregate properties of the weekly networks, consisting of all the interactions within a week. We considered modularity (a measure of how the community structure differs from a random network; higher modularity means a stronger community structure and a lower likelihood of transmission) and density (average number of neighbors per node; higher density generally favors a lower outbreak threshold) of the weekly networks. The modularity of the post-COVID-19 weekly networks was slightly lower (i.e., a weaker community structure and a more well-mixed network), while density was slightly higher; these differences were statistically significant, with the caveat that these are relatively small datasets (about 40 weeks). These differences (higher density and better connectivity) both increase the risk of transmission in the post-COVID-19 networks. In summary, the post-COVID-19 networks either had properties similar to the pre-COVID-19 networks or showed changes that are unlikely to have played a role in reducing MRSA transmission. Conclusion. A significant reduction in post-COVID-19 MRSA transmission may have been an unintended positive effect of enhanced infection control measures, particularly hand hygiene and increased mask use. A modest (11.6%) post-COVID-19 reduction in surveillance testing may have also played a role. Despite pandemic-related cohorting and census fluctuations, most network properties were not significantly different post-COVID-19, except for aggregate density and modularity, which varied in a direction that instead favored transmission; therefore, HCP-based networks did not play a significant role in reducing MRSA transmission. Multivariate modeling to isolate the relative contributions of these factors is underway. Figure 6. Surveillance Testing and Clinical Culturing. Post-COVID-19, there was a modest (11.6%) but statistically significant reduction in surveillance PCR testing (42.4 mean tests per 1,000 patient days pre- versus 37.5 post-COVID-19; P<0.002). There was not a statistically significant difference in rates of clinical cultures sent (2.48 cultures per 1,000 patient days pre- versus 2.23 post-COVID-19; P=0.288).
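
A condensed sketch of the network-property comparison described above, assuming two networkx graphs standing in for the pre- and post-period contact networks (the EMR-derived networks themselves are not available): compute per-node properties and compare the pre versus post distributions with a Mann-Whitney U test, as the abstract describes.

import networkx as nx
from scipy.stats import mannwhitneyu

def node_property_distributions(g):
    """Per-node properties used in the analysis above (degree, betweenness,
    closeness, eigenvector centrality, clustering coefficient)."""
    return {
        "degree": nx.degree_centrality(g),
        "betweenness": nx.betweenness_centrality(g),
        "closeness": nx.closeness_centrality(g),
        "eigenvector": nx.eigenvector_centrality(g, max_iter=1000),
        "clustering": nx.clustering(g),
    }

def compare_networks(g_pre, g_post):
    """Mann-Whitney U test on each property's pre vs. post distribution."""
    pre, post = node_property_distributions(g_pre), node_property_distributions(g_post)
    results = {}
    for prop in pre:
        _stat, p = mannwhitneyu(list(pre[prop].values()), list(post[prop].values()))
        results[prop] = p
    return results

# Toy random graphs standing in for the pre-/post-COVID-19 contact networks.
g_pre = nx.erdos_renyi_graph(300, 0.03, seed=1)
g_post = nx.erdos_renyi_graph(300, 0.03, seed=2)
print(compare_networks(g_pre, g_post))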

16.
2021 Winter Simulation Conference, WSC 2021 ; 2021-December, 2021.
Article in English | Scopus | ID: covidwho-1746022

ABSTRACT

Contact tracing (CT) is an important and effective intervention strategy for controlling an epidemic. Its role becomes critical when pharmaceutical interventions are unavailable. CT is resource intensive, and multiple protocols are possible; therefore, the ability to evaluate strategies is important. We describe a high-performance, agent-based simulation model for studying CT during an ongoing pandemic. This work was motivated by the COVID-19 pandemic; however, the framework and design are generic and can be applied in other settings. This work extends our HPC-oriented ABM framework EpiHiper to efficiently represent contact tracing. The main contributions are: (i) an extension of EpiHiper to represent realistic CT processes; and (ii) a realistic case study using the VA network, motivated by our collaboration with the Virginia Department of Health. © 2021 IEEE.
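
The EpiHiper extension itself is not shown in the abstract; purely as an illustrative stand-in, the toy routine below represents the core CT step an agent-based model must capture: for each newly detected case, reach each recent contact with some success probability and quarantine those reached. All names and parameters are hypothetical.

import random

def trace_and_quarantine(contacts, detected_cases, p_trace=0.7, seed=0):
    """contacts: dict person -> list of recent contacts.
    For each detected case, each recent contact is reached with probability
    p_trace and placed in quarantine (a toy version of a CT protocol)."""
    rng = random.Random(seed)
    quarantined = set()
    for case in detected_cases:
        for contact in contacts.get(case, []):
            if rng.random() < p_trace:
                quarantined.add(contact)
    return quarantined

# Toy contact lists; a real study would use a synthetic population network.
contacts = {"p1": ["p2", "p3", "p4"], "p5": ["p3", "p6"]}
print(trace_and_quarantine(contacts, detected_cases=["p1", "p5"]))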

17.
2021 IEEE International Conference on Big Data, Big Data 2021 ; : 1566-1574, 2021.
Article in English | Scopus | ID: covidwho-1730887

ABSTRACT

We study the role of vaccine acceptance in controlling the spread of COVID-19 in the US using AI-driven agent-based models. Our study uses a 288 million node social contact network spanning all 50 US states plus Washington DC, comprising 3300 counties, with 12.59 billion daily interactions. The highly resolved agent-based models use realistic information about disease progression, vaccine uptake, production schedules, acceptance trends, prevalence, and social distancing guidelines. Developing a national model at this resolution that is driven by realistic data requires a complex scalable workflow, model calibration, simulation, and analytics components. Our workflow optimizes the total execution time and helps in improving overall human productivity. This work develops a pipeline that can execute US-scale models and associated workflows that typically present significant big data challenges. Our results show that, when compared to faster and accelerating vaccinations, slower vaccination rates due to vaccine hesitancy cause averted infections to drop from 6.7M to 4.5M, and averted total deaths to drop from 39.4K to 28.2K nationwide. This occurs despite the fact that the final vaccine coverage is the same in both scenarios. Improving vaccine acceptance by 10% in all states increases averted infections from 4.5M to 4.7M (a 4.4% improvement) and averted total deaths from 28.2K to 29.9K (a 6% increase) nationwide. The analysis also reveals interesting spatio-temporal differences in COVID-19 dynamics as a result of vaccine acceptance. To our knowledge, this is the first national-scale analysis of the effect of vaccine acceptance on the spread of COVID-19 using detailed and realistic agent-based models. © 2021 IEEE.

18.
27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2021 ; : 2505-2513, 2021.
Article in English | Scopus | ID: covidwho-1430226

ABSTRACT

Timely, high-resolution forecasts of infectious disease incidence are useful for policy makers in deciding intervention measures and estimating healthcare resource burden. In this paper, we consider the task of forecasting COVID-19 confirmed cases at the county level for the United States. Although multiple methods have been explored for this task, their performance has varied across space and time due to noisy data and the inherent dynamic nature of the pandemic. We present a forecasting pipeline that incorporates probabilistic forecasts from multiple statistical, machine learning, and mechanistic methods through a Bayesian ensembling scheme, and has been operational for nearly 6 months, serving local, state, and federal policymakers in the United States. While showing that the Bayesian ensemble is at least as good as the individual methods, we also show that each individual method contributes significantly for different spatial regions and time points. We compare our model's performance with other similar models being integrated into the CDC-initiated COVID-19 Forecast Hub, and show better performance at longer forecast horizons. Finally, we also describe how such forecasts are used to increase lead time for training mechanistic scenario projections. Our work demonstrates that such a real-time, high-resolution forecasting pipeline can be developed by integrating multiple methods within a performance-based ensemble to support pandemic response. © 2021 Owner/Author.
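
A simplified sketch of one way to realize performance-based Bayesian ensembling of the kind described above, assuming Gaussian predictive distributions and hypothetical model names and forecasts (the operational pipeline's actual scheme is not reproduced): each method is weighted by the likelihood of recently observed counts under its predictive distribution.

import numpy as np
from scipy.stats import norm

def bayesian_ensemble_weights(recent_obs, model_means, model_sds):
    """Weight each model by the likelihood of recent observations under its
    (here Gaussian) predictive distribution -- a simplified stand-in for the
    paper's Bayesian ensembling scheme."""
    logliks = {}
    for m in model_means:
        logliks[m] = norm.logpdf(recent_obs, loc=model_means[m], scale=model_sds[m]).sum()
    # normalize in log space for numerical stability
    max_ll = max(logliks.values())
    w = {m: np.exp(ll - max_ll) for m, ll in logliks.items()}
    total = sum(w.values())
    return {m: v / total for m, v in w.items()}

# Hypothetical county-level case counts for the last 4 weeks and each model's
# point forecasts (means) and uncertainty (standard deviations) for those weeks.
obs = np.array([210.0, 260.0, 330.0, 400.0])
means = {"stat": np.array([200, 240, 300, 380.0]),
         "ml":   np.array([220, 280, 360, 450.0]),
         "mech": np.array([190, 210, 250, 300.0])}
sds = {"stat": 30.0, "ml": 40.0, "mech": 35.0}
print(bayesian_ensemble_weights(obs, means, sds))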

19.
27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2021 ; : 2632-2642, 2021.
Article in English | Scopus | ID: covidwho-1430224

ABSTRACT

Mobility restrictions have been a primary intervention for controlling the spread of COVID-19, but they also place a significant economic burden on individuals and businesses. To balance these competing demands, policymakers need analytical tools to assess the costs and benefits of different mobility reduction measures. In this paper, we present our work motivated by our interactions with the Virginia Department of Health on a decision-support tool that utilizes large-scale data and epidemiological modeling to quantify the impact of changes in mobility on infection rates. Our model captures the spread of COVID-19 by using a fine-grained, dynamic mobility network that encodes the hourly movements of people from neighborhoods to individual places, with over 3 billion hourly edges. By perturbing the mobility network, we can simulate a wide variety of reopening plans and forecast their impact in terms of new infections and the loss in visits per sector. To deploy this model in practice, we built a robust computational infrastructure to support running millions of model realizations, and we worked with policymakers to develop an interactive dashboard that communicates our model's predictions for thousands of potential policies. © 2021 ACM.
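
A toy illustration of the "perturb the mobility network" idea described above (the deployed model's dynamic network with billions of hourly edges and its epidemic module are not reproduced): scale hourly visit counts for chosen place categories according to a reopening plan and report the change in total visits, which a downstream disease model would then consume. All names and numbers are hypothetical.

def apply_reopening_plan(hourly_visits, sector_of_place, plan):
    """hourly_visits: dict (neighborhood, place) -> visits per hour.
    plan: dict sector -> fraction of normal visits allowed (e.g. 0.5 means
    restaurants operate at half capacity). Returns the perturbed network."""
    return {
        edge: count * plan.get(sector_of_place[edge[1]], 1.0)
        for edge, count in hourly_visits.items()
    }

# Hypothetical neighborhood-to-place visit counts and a 50%-capacity
# restaurant / closed-gym reopening scenario.
visits = {("cbg1", "restaurant_A"): 40, ("cbg1", "grocery_B"): 25, ("cbg2", "gym_C"): 15}
sectors = {"restaurant_A": "restaurant", "grocery_B": "grocery", "gym_C": "gym"}
plan = {"restaurant": 0.5, "gym": 0.0}
perturbed = apply_reopening_plan(visits, sectors, plan)
print("visits lost per hour:", sum(visits.values()) - sum(perturbed.values()))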

20.
35th IEEE International Parallel and Distributed Processing Symposium, IPDPS 2021 ; : 639-650, 2021.
Article in English | Scopus | ID: covidwho-1393745

ABSTRACT

The COVID-19 global outbreak represents the most significant epidemic event since the 1918 influenza pandemic. Simulations have played a crucial role in supporting COVID-19 planning and response efforts. Developing scalable workflows to provide policymakers with quick responses to important questions pertaining to logistics, resource allocation, epidemic forecasts, and intervention analysis remains a challenging computational problem. In this work, we present scalable, high performance computing-enabled workflows for COVID-19 pandemic planning and response. The scalability of our methodology allows us to run fine-grained simulations daily and to generate county-level forecasts and other counterfactual analyses for each of the 50 states (and DC) and 3140 counties across the USA. Our workflows use a hybrid cloud/cluster system utilizing a combination of local and remote cluster computing facilities, with over 20,000 CPU cores running for 6-9 hours every day to meet this objective. Our state (Virginia), state hospital network, our university, the DOD, and the CDC use our models to guide their COVID-19 planning and response efforts. We began executing these pipelines on March 25, 2020, and have delivered and briefed weekly updates to these stakeholders for over 30 weeks without interruption. © 2021 IEEE.
